Homework 4


Question 1
?/? point (graded)

0 [q1.1]
1 [q1.2]
0 [q1.3]
1 [q1.4]
0 [q1.5]
1 [q1.6]



Question 2
?/? point (graded)

Your answers will be evaluated to 4 decimal places.

coefficient   value
a             [q2.1]
b             [q2.2]
c             [q2.3]
d             [q2.4]
0             [q2.5]
1             [q2.6]



Question 3
?/? point (graded)

Consider this HMM.

The prior probability P(X0), the dynamics model P(Xt | Xt-1), and the sensor model P(Et | Xt) are as follows:

If the evidence is as given, what is the most likely explanation?


Question 4
?/? point (graded)

The Viterbi algorithm finds the most probable sequence of hidden states X1, ..., XT given a sequence of observations e1, ..., eT. For the HMM structure above, which of the following probabilities are maximized by the sequence of states returned by the Viterbi algorithm? Select all correct options.
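As a reference, the Viterbi recursion can be sketched on a toy HMM. The states, prior, transition, and sensor tables below are illustrative placeholders, not the tables from this problem:

```python
import numpy as np

# Hypothetical two-state HMM; all numbers are illustrative only.
states = ["A", "B"]
prior = np.array([0.6, 0.4])              # P(X1)
trans = np.array([[0.7, 0.3],             # P(X_t | X_{t-1}), rows = previous state
                  [0.4, 0.6]])
sensor = np.array([[0.9, 0.1],            # P(E_t | X_t), columns = observation in {0, 1}
                   [0.2, 0.8]])

def viterbi(obs):
    """Return the most likely state sequence for the observations `obs`."""
    T = len(obs)
    m = np.zeros((T, len(states)))        # max joint probability ending in each state
    back = np.zeros((T, len(states)), dtype=int)
    m[0] = prior * sensor[:, obs[0]]
    for t in range(1, T):
        for j in range(len(states)):
            scores = m[t - 1] * trans[:, j]
            back[t, j] = np.argmax(scores)
            m[t, j] = scores[back[t, j]] * sensor[j, obs[t]]
    # Backtrack from the best final state.
    path = [int(np.argmax(m[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[i] for i in reversed(path)]

print(viterbi([0, 0, 1]))                 # → ['A', 'A', 'B']
```

Note that Viterbi maximizes the joint probability of the whole state sequence, which is not the same as independently picking the most likely state at each time step.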


Question 5
?/? point (graded)

Consider the following graph, where W1 and W2 can each be either R or S, and I1 and I2 can each be either T or F:

The conditional probability distributions are given by:

Now we want to try approximate inference through sampling. Applying likelihood weighting, suppose we generate the following six samples given the evidence I1 = T and I2 = F:

Then the weight of the first sample (S, T, R, F) is:

The result from likelihood weighting is: (Your answers will be evaluated to 4 decimal places.)
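The mechanics of likelihood weighting can be sketched as follows. The network structure (a chain W1 -> W2 with indicators I1 and I2) and all CPT numbers below are assumptions for illustration, not this problem's actual distributions; with these placeholder CPTs, the sample (W1=S, I1=T, W2=R, I2=F) would get weight P(I1=T | W1=S) * P(I2=F | W2=R):

```python
import random

# Hypothetical CPTs for a chain W1 -> W2 with indicators I1, I2.
p_w1 = {"R": 0.5, "S": 0.5}                       # P(W1)
p_w2 = {"R": {"R": 0.7, "S": 0.3},                # P(W2 | W1)
        "S": {"R": 0.3, "S": 0.7}}
p_i = {"R": {"T": 0.9, "F": 0.1},                 # P(I | parent W)
       "S": {"T": 0.2, "F": 0.8}}

def weighted_sample(evidence):
    """Likelihood weighting: sample non-evidence variables from their CPTs,
    fix evidence variables, and multiply their likelihoods into the weight."""
    weight = 1.0
    w1 = random.choices(list(p_w1), weights=p_w1.values())[0]
    weight *= p_i[w1][evidence["I1"]]             # I1 is evidence: weight, don't sample
    w2 = random.choices(list(p_w2[w1]), weights=p_w2[w1].values())[0]
    weight *= p_i[w2][evidence["I2"]]             # I2 is evidence: weight, don't sample
    return (w1, w2), weight

# Estimate e.g. P(W1 = R | I1 = T, I2 = F) as a ratio of weight sums.
evidence = {"I1": "T", "I2": "F"}
num = den = 0.0
for _ in range(10000):
    (w1, _), w = weighted_sample(evidence)
    den += w
    if w1 == "R":
        num += w
print(round(num / den, 4))
```

The query estimate is the total weight of samples consistent with the query value divided by the total weight of all samples, which is exactly the computation the six given samples feed into.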


Question 6
?/? point (graded)

After the observation step of particle filtering, the particles and their weights are as follows:

Fill in the weighted sample distribution you used in the resampling step. Your answers will be evaluated to 4 decimal places.

P'(A) =

P'(B) =

P'(C) =

P'(D) =
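The weighted sample distribution is the normalized weight mass at each particle value: P'(x) = (total weight of particles at x) / (total weight). The particle values and weights below are illustrative placeholders, not this problem's numbers:

```python
import random
from collections import Counter

# Hypothetical particles after the observation (weighting) step.
particles = ["A", "A", "B", "C", "D", "D"]
weights   = [0.1, 0.3, 0.2, 0.4, 0.1, 0.1]

# P'(x) = (total weight of particles at x) / (total weight).
total = sum(weights)
p_prime = Counter()
for x, w in zip(particles, weights):
    p_prime[x] += w / total
print({x: round(p, 4) for x, p in sorted(p_prime.items())})
# → {'A': 0.3333, 'B': 0.1667, 'C': 0.3333, 'D': 0.1667}

# Resampling then draws new, unweighted particles from P'.
new_particles = random.choices(particles, weights=weights, k=len(particles))
```

After resampling, all particles carry equal weight again; P' only determines how often each value is drawn.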